IBM Sets Its NLP Ambitions High With New Capabilities In Watson

#artificialintelligence

Boosting its NLP capabilities, IBM has launched new capabilities in IBM Watson Discovery and IBM Watson Assistant that will empower businesses to deploy and scale sophisticated AI systems. These capabilities apply NLP accurately and efficiently while requiring less data and less training time. It is another significant step by the tech giant toward offering an advanced ability to understand the language of business. Aiming to bring better NLP and NLU offerings to users of its enterprise products, the company has again shown its drive to take its NLP efforts to new heights. While IBM's recent announcements focus on language, explainability, and workplace automation, the updates to its language capabilities include reading comprehension, FAQ extraction, and improved interactions in Watson Assistant.


2020 is Conversational AI -- trends into the future

#artificialintelligence

Conversational artificial intelligence (AI) empowers enterprises to employ chatbots, messaging applications, or virtual assistants to build highly engaging and valuable relationships with customers. This cutting-edge technology is spreading rapidly across every industry and, more excitingly, offers enterprises huge potential to accelerate growth and innovation. From the little it has seen of AI assistants so far, the enterprise world has built great expectations: it envisions a future full of bots so smart and powerful that they can help humans with almost any kind of support. No wonder conversational AI has become a much-awaited technology in today's enterprise world, attracting the attention of business leaders across the globe. But how much of it is hype, and how much is really close to reality?


IBM Watson Just Analysed a TV Debate. Read to Know How

#artificialintelligence

Bloomberg Television's show "That's Debatable" had an unusual participant on its October 9 broadcast. In a debate on the topic "Is it time to redistribute the world's wealth?", IBM Watson synthesised thousands of responses and opinions received from the public and incorporated them into the debate. IBM Watson used a new natural language processing feature called key point analysis, which categorises and summarises thousands of public opinions into a handful of concrete key points. Key point analysis is essentially the next generation of 'extractive summarisation', which processes the statements in a text document to summarise its most significant points.
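To make the idea of extractive summarisation concrete, here is a minimal frequency-based sketch: it selects the sentences whose words occur most often in the document. This is only an illustration of the general technique; Watson's key point analysis is far more sophisticated, clustering and ranking arguments rather than scoring raw sentences.

```python
import re
from collections import Counter

def extractive_summary(text, num_sentences=2):
    """Naive extractive summarisation: keep the sentences whose words
    are most frequent in the document, normalised by sentence length."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    top = sorted(sentences, key=score, reverse=True)[:num_sentences]
    # Return the selected sentences in their original order.
    return [s for s in sentences if s in top]
```

Because the score is a per-word average, short sentences packed with frequent terms rank highly, which is why real systems add much richer signals (position, redundancy, argument structure).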


Understanding BERT

#artificialintelligence

While BERT is a significant improvement in how computers 'understand' human language, it is still far from understanding language and context the way humans do. We should, however, expect BERT to have a significant impact on many understanding-focused NLP initiatives. The General Language Understanding Evaluation (GLUE) benchmark is a collection of datasets used for training, evaluating, and analyzing NLP models relative to one another. The datasets are designed to test a model's language understanding and are useful for evaluating models like BERT. As the GLUE results show, BERT can outperform humans even on comprehension tasks previously thought to be beyond the reach of computers.


NLP for Analytics: It's Not Just About Text - InformationWeek

#artificialintelligence

Organizations have been using natural language processing (NLP) for text analytics tasks such as social media sentiment analysis and contract review, but NLP usage has been expanding. "The big change that's happened in the last five years is the amount of context and understanding that can be extracted or used when understanding documents," said Nigel Duffy, global artificial intelligence leader at EY. "Our ability to understand information from documents is much, much greater than it was a few years ago." BI and analytics vendors are adding NLP capabilities to their products, such as natural language generation for data visualization narration and natural language understanding for natural language search. In doing so, they are making data visualizations easier to understand and their products easier to use. For example, Tableau, Sisense, and Qlik have all partnered with Narrative Science to narrate data visualizations with text.
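In its simplest form, the visualization narration described above is template-based natural language generation: compute a few statistics from the series behind a chart and slot them into a sentence. The sketch below is a toy illustration under that assumption; the function name and wording are hypothetical, and commercial tools such as Narrative Science generate far richer narratives.

```python
def narrate(series_name, values):
    """Toy template-based NLG: turn a numeric series into one
    narration sentence (illustrative only, not a real vendor API)."""
    first, last = values[0], values[-1]
    change = last - first
    # Pick a verb describing the overall trend from first to last value.
    direction = "rose" if change > 0 else "fell" if change < 0 else "held steady"
    return (f"{series_name} {direction} from {first} to {last}, "
            f"peaking at {max(values)}.")
```

A real system would also choose which statistics are noteworthy (outliers, streaks, seasonality) rather than always reporting the same template slots.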


Natural Language Processing Examples in Government Data

#artificialintelligence

Tom is an analyst at the US Department of Defense (DoD).1 All day long, he and his team collect and process massive amounts of data from a variety of sources--weather data from the National Weather Service, traffic information from the US Department of Transportation, military troop movements, public website comments, and social media posts--to assess potential threats and inform mission planning. While some of the information Tom's group collects is structured and can be categorized easily (such as tropical storms in progress or active military engagements), the vast majority is simply unstructured text, including social media conversations, comments on public websites, and narrative reports filed by field agents. Because the data is unstructured, it's difficult to find patterns and draw meaningful conclusions. Tom and his team spend much of their day poring over paper and digital documents to detect trends, patterns, and activity that could raise red flags. In response to these kinds of challenges, DoD's Defense Advanced Research Projects Agency (DARPA) recently created the Deep Exploration and Filtering of Text (DEFT) program, which uses natural language processing (NLP), a form of artificial intelligence, to automatically extract relevant information and help analysts derive actionable insights from it.2 Across government, whether in defense, transportation, human services, public safety, or health care, agencies struggle with a similar problem--making sense out of huge volumes of unstructured text to inform decisions, improve services, and save lives.
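The core task the passage describes is turning unstructured text into structured signals an analyst can triage. As a minimal sketch of that idea only, the snippet below flags reports that mention terms on a watchlist; the watchlist and function are hypothetical, and DARPA's DEFT program uses statistical NLP rather than keyword matching.

```python
import re

# Hypothetical watchlist, for illustration only.
WATCHLIST = {"storm", "convoy", "outage"}

def flag_reports(reports):
    """Return (report_index, matched_terms) pairs for free-text
    reports that mention any watchlist term."""
    flagged = []
    for i, text in enumerate(reports):
        tokens = set(re.findall(r"[a-z]+", text.lower()))
        hits = tokens & WATCHLIST
        if hits:
            flagged.append((i, sorted(hits)))
    return flagged
```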


How to Balance AI Capabilities with UX Design when Building Chatbots

#artificialintelligence

Chatbot use cases and their associated user bases vary greatly. In some cases, you may be automating a complex enterprise workflow tailored to the technically savvy. On the other hand, you may be building a chatbot that helps schedule flights and serves users who are technical novices. Understanding your chatbot's use case and the needs and capabilities of your users is critical to finding the right balance between AI capabilities and UX design. In the case of the flight-scheduling chatbot, you may have more success incorporating response buttons and visuals that guide users through its capabilities, at the expense of the free-text user responses that could otherwise be aggregated to improve AI, ML, and NLP functionality.
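The button-versus-free-text tradeoff can be sketched as a routing decision: exact button choices resolve an intent immediately, while free text falls back to language understanding. In the toy sketch below, a crude substring match stands in for a real NLU model, and all names are illustrative assumptions rather than any particular chatbot framework's API.

```python
def handle_turn(user_input, options):
    """Route one chatbot turn. `options` maps button labels to intents.
    Button taps resolve exactly; free text falls back to a crude
    substring matcher standing in for a real NLU model."""
    normalized = user_input.strip().lower()
    if normalized in options:            # user tapped a response button
        return options[normalized]
    for label, intent in options.items():  # free-text fallback
        if label in normalized:
            return intent
    return "clarify"                     # ask the user to rephrase
```

The design point from the article shows up directly: the more turns that resolve on the button path, the less free text you collect for improving the fallback model.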